
    Tracking performance for long-lived particles at LHCb

    The LHCb experiment is dedicated to the study of $c$- and $b$-hadron decays, including long-lived particles such as $K_s$ and strange baryons ($\Lambda^0$, $\Xi^-$, etc.). These kinds of particles are difficult to reconstruct with the LHCb tracking system since they escape detection in the first tracker. A new method to evaluate the performance of the different tracking algorithms for long-lived particles using real data samples has been developed. Special emphasis is laid on particles hitting only part of the tracking system of the new LHCb upgrade detector. Comment: Proceedings for Connecting the Dots and Workshop on Intelligent Trackers (CTD/WIT 2019).
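    A data-driven efficiency of this kind is typically quoted as the fraction of reconstructible candidates that are matched to a reconstructed track, binned in a kinematic variable. The sketch below shows that calculation with a simple binomial uncertainty; the function and inputs are hypothetical and not taken from the LHCb software.

```python
import numpy as np

def tracking_efficiency(reconstructible, matched, p_edges):
    """Per-momentum-bin efficiency: matched / reconstructible, with binomial errors.

    reconstructible : array of candidate momenta [GeV] passing the denominator selection
    matched         : boolean array, True if the candidate was matched to a reconstructed track
    p_edges         : momentum bin edges [GeV]
    """
    den, _ = np.histogram(reconstructible, bins=p_edges)
    num, _ = np.histogram(reconstructible[matched], bins=p_edges)
    eff = np.divide(num, den, out=np.zeros_like(num, dtype=float), where=den > 0)
    err = np.sqrt(np.maximum(eff * (1 - eff), 0.0) / np.maximum(den, 1))  # simple binomial error
    return eff, err

# Toy usage with invented numbers, purely to show the shape of the calculation.
rng = np.random.default_rng(0)
p = rng.uniform(2, 100, size=5000)       # candidate momenta in GeV
is_matched = rng.random(5000) < 0.8      # pretend 80% matching probability
eff, err = tracking_efficiency(p, is_matched, p_edges=np.linspace(2, 100, 10))
print(eff, err)
```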

    Radiative bb-baryon decays to measure the photon and bb-baryon polarization

    The radiative decays of $b$-baryons facilitate the direct measurement of the photon helicity in $b\to s\gamma$ transitions, thus serving as an important test of physics beyond the Standard Model. In this paper we analyze the complete angular distribution of ground-state $b$-baryon ($\Lambda_b^0$ and $\Xi_b^-$) radiative decays to multibody final states, assuming an initially polarized $b$-baryon sample. Our sensitivity study suggests that the photon polarization asymmetry can be extracted to good accuracy along with a simultaneous measurement of the initial $b$-baryon polarization. With higher yields of $b$-baryons, achievable in subsequent runs of the Large Hadron Collider (LHC), we find that the photon polarization measurement can play a pivotal role in constraining different new physics scenarios. Comment: Typos corrected, reference added.
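    For orientation, a commonly quoted one-dimensional projection of such an angular analysis (e.g. for $\Lambda_b^0 \to \Lambda\gamma$, and up to sign conventions) relates the photon emission angle to the product of the photon polarization asymmetry $\alpha_\gamma$ and the initial baryon polarization; the full multibody distribution analyzed in the paper is more involved, so the expression below is only a schematic illustration:
    \[
      \frac{1}{\Gamma}\,\frac{d\Gamma}{d\cos\theta_\gamma} \;=\; \frac{1}{2}\left(1 \,-\, \alpha_\gamma\, P_{\Lambda_b^0} \cos\theta_\gamma\right),
    \]
    where $\theta_\gamma$ is the photon angle with respect to the polarization axis and $P_{\Lambda_b^0}$ is the initial $b$-baryon polarization.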

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade. Peer reviewed.

    Triggering new discoveries: development of advanced HLT1 algorithms for detection of long-lived particles at LHCb

    The work presented in this thesis constitutes a significant contribution to the first high-level trigger (HLT1) of the LHCb experiment, based on the Allen project. In Allen, the entire HLT1 sequence of reconstruction algorithms has been designed to be executed on GPU cards. The work in this thesis has helped propel the project forward, enabling the LHCb trigger during Run 3 to successfully select events in real time at a rate of 30 MHz. An extensive effort has been made during the Allen development programme, leading to the creation of an Allen performance portability layer which enables the framework to be executed on several architectures. Furthermore, contributions to several key algorithms within this framework are presented. One of these algorithms, termed HybridSeeding, efficiently reconstructs the tracks produced in the SciFi detector (T-tracks). Another algorithm, named VELO-SciFi Matching, builds upon the former and allows the reconstruction of long tracks with a momentum precision better than 1%.

    Additionally, a new algorithm named Downstream has been conceived, developed and incorporated into HLT1 for the first time. It performs a fast and efficient search of hits in the UT detector and applies a fast neural network (NN) to reject ghost tracks, allowing downstream tracks to be reconstructed with an efficiency of 70% and a ghost rate below 20%. This is the first time that a NN has been developed for GPUs inside Allen. The new algorithm enables the selection of long-lived particles at the HLT1 level, opening up new opportunities within both the Standard Model and its extensions. Of particular note is its role in expanding the search scope for exotic long-lived particles with lifetimes from 100 ps to several nanoseconds, a domain unexplored until now by the LHCb experiment. This, in turn, enhances the sensitivity to new particles predicted by theories that include a dark sector, heavy neutral leptons, supersymmetry, or axion-like particles. In addition, LHCb's ability to detect Standard Model particles such as $\Lambda$ and $K_s$ is greatly augmented, thereby enhancing the precision of analyses involving b- and c-hadron decays. The integration of the HLT1 selection lines derived from the Downstream algorithm into LHCb's real-time monitoring infrastructure will be important for data taking during Run 3 and beyond, and notably for the present alignment and calibration of the UT detector.

    The precision in measuring observables which are sensitive to physics beyond the Standard Model, such as the rare $\Lambda_b \to \Lambda\gamma$ decay channel, will be greatly augmented. In this thesis a study of the measurement of the branching fraction of the $\Lambda_b \to \Lambda\gamma$ decay relative to the $B \to K^*\gamma$ channel has been performed. The analysis procedure, including selection, reconstruction and background rejection, has been described, and an evaluation of the main systematic uncertainties affecting the measurement has been included. It is concluded that the statistical precision for Run 3 will be below 2% as a result of the inclusion of downstream tracks. The measurement of the photon polarisation in these transitions will also benefit from the increase in yield, reaching a 10% precision on the $\alpha_\gamma$ parameter. Measurements of the CP asymmetry in $\Lambda_b \to \Lambda\gamma$ decays will also reach higher precision.
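    As an illustration of the ghost-rejection step, the sketch below runs a single-hidden-layer neural network over a few per-track features and rejects candidates above a ghost-probability threshold. The feature names, network size, weights and threshold are all invented for the example; they do not reproduce the actual Allen implementation, which runs in CUDA on the GPU.

```python
import numpy as np

# Hypothetical per-track features used to separate genuine downstream tracks
# from ghosts (names are illustrative, not the actual Allen feature set).
FEATURES = ["chi2_per_dof", "n_ut_hits", "match_distance_x", "match_distance_y"]

def mlp_ghost_probability(x, w1, b1, w2, b2):
    """Single-hidden-layer MLP inference, mirroring the kind of lightweight
    network that can run per track on a GPU: returns a ghost probability in [0, 1]."""
    h = np.maximum(x @ w1 + b1, 0.0)              # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))   # sigmoid output

# Toy weights and one toy track, purely to show the data flow.
rng = np.random.default_rng(1)
w1, b1 = rng.normal(size=(len(FEATURES), 8)), np.zeros(8)
w2, b2 = rng.normal(size=8), 0.0
track = np.array([1.2, 4.0, 0.3, -0.1])            # one candidate's feature vector
keep = mlp_ghost_probability(track, w1, b1, w2, b2) < 0.5  # reject likely ghosts
print(keep)
```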

    Connecting the Dots 2022


    Goals and performance of LHCb upgrade II


    Standalone track reconstruction and matching algorithms for GPU-based High level trigger at LHCb

    The LHCb Upgrade in Run 3 has changed its trigger scheme to a full software selection in two steps. The first step, HLT1, will be entirely implemented on GPUs and will run a fast selection aiming at reducing the visible collision rate from 30 MHz to 1 MHz. This selection relies on a partial reconstruction of the event. A version of this reconstruction starts with two monolithic tracking algorithms, the VELO-pixel tracking and the HybridSeeding on the Scintillating-Fiber tracker, which reconstruct track segments in the standalone sub-detectors. Those segments are then joined through a matching algorithm in order to produce 'long' tracks, which form the base of the HLT1 reconstruction. We discuss the principle of these algorithms as well as the details of their implementation, which allow them to run in a high-throughput configuration. An emphasis is put on the optimization of the algorithms themselves in order to take advantage of the GPU architecture. Finally, results are presented in the context of the LHCb performance requirements for Run 3.
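    To make the matching step concrete, the sketch below pairs VELO and SciFi segments by comparing position and slope at a common reference plane and keeping, per VELO segment, the best pairing below a chi-square cut. The state representation, the straight-line extrapolation and the resolution values are simplifications chosen for illustration; the real algorithm accounts for the magnetic field and uses both coordinates.

```python
def match_segments(velo_states, scifi_states, sigma_x=5.0, sigma_tx=0.002, max_chi2=9.0):
    """Toy VELO-SciFi matching: compare each VELO segment's extrapolated position
    and slope at a reference plane with every SciFi segment, and keep the best
    pairing below a chi-square cut.

    Each state is a (x_at_plane_mm, tx) pair; resolutions and the straight-line
    model are invented for illustration only.
    """
    pairs = []
    for i, (xv, txv) in enumerate(velo_states):
        best = None  # (scifi_index, chi2) of the best candidate so far
        for j, (xs, txs) in enumerate(scifi_states):
            chi2 = ((xv - xs) / sigma_x) ** 2 + ((txv - txs) / sigma_tx) ** 2
            if chi2 < max_chi2 and (best is None or chi2 < best[1]):
                best = (j, chi2)
        if best is not None:
            pairs.append((i, best[0], best[1]))
    return pairs

# Two toy VELO segments and two toy SciFi segments: only the first pair matches.
velo = [(12.0, 0.010), (-30.0, -0.004)]
scifi = [(13.5, 0.011), (250.0, 0.080)]
print(match_segments(velo, scifi))  # -> [(0, 0, 0.34)]
```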

    Radiative B Decays at LHCb

    Rare radiative B decays are sensitive probes of New Physics through the study of branching fractions, CP asymmetries and other observables related to the photon polarization. The LHCb experiment has performed several measurements with radiative B decays. These results provide constraints on predictions of models beyond the Standard Model and are at present key to understanding the nature of flavor physics.

    Effect of the high-level trigger for detecting long-lived particles at LHCb

    Long-lived particles (LLPs) show up in many extensions of the Standard Model, yet are challenging to search for with current detectors due to their very displaced vertices. This article evaluates the ability of the trigger algorithms used in the LHCb experiment to detect long-lived particles, and describes work to adapt them in order to enhance the sensitivity of the experiment to undiscovered long-lived particles. A model with a Higgs portal to a dark sector is tested, and the sensitivity reach is discussed. In the LHCb tracking system, the tracking station farthest from the collision point is the Scintillating Fibre tracker, the SciFi detector. One of the challenges in the track reconstruction is dealing with the large number of hits and the resulting combinatorics in this detector. A dedicated algorithm has been developed to cope with the large data output. When fully implemented, this algorithm would greatly increase the available statistics for any long-lived particle search in the forward region, and would additionally improve the sensitivity of analyses dealing with long-lived Standard Model particles such as $K_s$ or $\Lambda$ hadrons.
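    As a concrete example of the kind of quantity a long-lived-particle selection can cut on, the sketch below computes a candidate's proper decay time from its flight vector and momentum using the standard relation $t = m L / (p c)$; the function name, inputs and threshold comment are illustrative rather than taken from the LHCb trigger code.

```python
import math

def decay_time_ps(decay_vertex, primary_vertex, momentum, mass_gev):
    """Proper decay time (ps) of a candidate from its flight vector and momentum.

    Vertices in mm, momentum in GeV; this implements t = m * L / (p * c),
    with names and example values chosen purely for illustration.
    """
    c_mm_per_ps = 0.299792458  # speed of light in mm/ps
    flight = [d - v for d, v in zip(decay_vertex, primary_vertex)]
    p_mag = math.sqrt(sum(px * px for px in momentum))
    # Project the flight vector onto the momentum direction (signed flight distance).
    length = sum(f * px for f, px in zip(flight, momentum)) / p_mag
    return mass_gev * length / (p_mag * c_mm_per_ps)

# Toy candidate: a K_s-like particle decaying ~800 mm downstream of the PV.
t = decay_time_ps(decay_vertex=(20.0, -5.0, 800.0),
                  primary_vertex=(0.0, 0.0, 0.0),
                  momentum=(0.5, -0.1, 20.0),
                  mass_gev=0.4976)
print(f"decay time = {t:.1f} ps")  # a trigger line might require t above some threshold
```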